➀ SK hynix is preparing for mass production of its 16-Hi HBM3E memory; ➁ The company is integrating new equipment and optimizing facilities; ➂ Production tests have begun, with supply expected in the first half of 2025.
Recent #AI GPU news in the semiconductor industry
➀ Elon Musk's xAI startup has ordered $1.08 billion worth of NVIDIA GB200 AI servers, seeking priority access. ➁ The servers are to be delivered by January 2025 and are expected to expand the capabilities of xAI's Colossus supercomputer. ➂ The order represents a significant upgrade to xAI's AI computing capacity.
➀ NVIDIA CEO Jensen Huang has asked SK hynix to pull forward the supply of HBM4 memory by six months; ➁ SK hynix originally planned to have its next-gen HBM4 memory ready in the second half of 2025; ➂ NVIDIA currently uses SK hynix's HBM3E memory in its AI chips and plans to adopt HBM4 in its upcoming Rubin R100 AI GPU.
➀ ByteDance is developing two AI GPUs to be produced on TSMC's 5nm process node; ➁ Mass production is expected in 2026; ➂ The move aims to reduce reliance on NVIDIA while complying with US export regulations.
➀ Supermicro confirms the delay of NVIDIA's Blackwell B200 AI GPU; ➁ Offers liquid-cooled Hopper H200 AI GPUs as an alternative; ➂ Asserts minimal impact on the AI server market.
➀ Samsung is expected to tape out its next-generation HBM4 memory in Q4 2024. ➁ Mass production of HBM4 is scheduled for Q4 2025, targeting NVIDIA's next-gen Rubin R100 AI GPU. ➂ AI industry demand is driving the growth of HBM memory, with SK hynix and Samsung investing heavily in production capacity.
➀ South Korea's memory chip exports to Taiwan surged 225% in the first half of 2025. ➁ The surge is due to the high demand for HBM memory in AI GPUs. ➂ Taiwan became the third-largest importer of South Korean memory chips, surpassing Vietnam and the United States.
➀ NVIDIA's Blackwell AI GPUs are facing significant design issues requiring a major redesign. ➁ The new B200A AI GPU is being developed to address these issues and is scheduled for release in the second half of 2025. ➂ The B200A will feature a B102 die, 4 stacks of HBM, and will be available in 700W and 1000W HGX form factors with up to 144GB of HBM3E memory.
➀ Bernstein analysts have raised their price target for TSMC's shares; ➁ TSMC is expected to exceed its 2024 guidance, driven by demand from US tech giants and the success of the N3 process node; ➂ Continued growth in data center AI revenue and the launch of new AI-capable smartphones contribute to TSMC's success.